

Section: Partnerships and Cooperations

International Initiatives

Inria Associate Teams

VERSAMUS

Participants: Emmanuel Vincent, Nobutaka Ito, Gabriel Sargent, Frédéric Bimbot, Rémi Gribonval.

  • Title: Integrated probabilistic music representations for versatile music content processing

  • Inria principal investigator: Emmanuel Vincent

  • International Partner (Institution - Laboratory - Researcher):

    • University of Tokyo (Japan) - Department of Information Physics and Computing

  • Duration: 2010 - 2012

  • See also: http://versamus.inria.fr/

  • Music plays a major role in everyday use of digital media content. Companies and users expect smart content creation and distribution functionalities, such as music classification, search by similarity, summarization, chord transcription, remixing and automatic accompaniment. So far, research efforts have focused on developing specific algorithms and corpora for each functionality, based on low-level features characterizing the sound as a whole. Yet music generally results from the superposition of heterogeneous sound components (e.g. voices, pitched musical instruments, drums, sound samples) carrying interdependent features at several levels (e.g. music genre, singer identity, melody, lyrics, voice signal). Integrated music representations combining all feature levels would make it possible to address all of the above functionalities with increased accuracy, as well as to visualize and interact with the content in a musically relevant way. The aim of this project was to investigate, design and validate such representations in the framework of Bayesian data analysis, which provides a rigorous way of combining separate feature models in a modular fashion (a schematic illustration is given after this list). Tasks addressed in the project included the design of a versatile model structure, of a library of feature models, and of efficient algorithms for parameter inference and model selection.
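
The following minimal sketch illustrates the general idea of modular Bayesian combination and model selection referred to above; it is not the project's actual model. It assumes hypothetical feature streams and simple per-feature distributions: each feature stream gets its own probabilistic model, their log-likelihoods are summed under an independence assumption, and candidate model structures are compared with a BIC approximation of the marginal likelihood.

    # Minimal, hypothetical sketch (not VERSAMUS code): combining independent
    # per-feature probabilistic models and comparing candidate model structures
    # with a BIC approximation of the marginal likelihood.
    import numpy as np
    from scipy import stats

    def gaussian_feature_model(x):
        """Fit a Gaussian to one feature stream; return (log-likelihood, #free params)."""
        mu, sigma = x.mean(), x.std(ddof=1)
        return stats.norm(mu, sigma).logpdf(x).sum(), 2

    def gamma_feature_model(x):
        """Fit a Gamma distribution to a non-negative feature stream."""
        shape, loc, scale = stats.gamma.fit(x, floc=0.0)
        return stats.gamma(shape, loc=loc, scale=scale).logpdf(x).sum(), 2

    def bic_score(feature_streams, component_models):
        """Sum per-feature BIC contributions under an independence assumption
        (higher is better)."""
        score = 0.0
        for x, model in zip(feature_streams, component_models):
            loglik, n_params = model(x)
            score += loglik - 0.5 * n_params * np.log(len(x))
        return score

    rng = np.random.default_rng(0)
    # Hypothetical feature streams, e.g. a pitch-like and an energy-like feature.
    pitch_feat = rng.normal(440.0, 15.0, size=500)
    energy_feat = rng.gamma(2.0, 0.5, size=500)

    candidates = {
        "gaussian+gaussian": [gaussian_feature_model, gaussian_feature_model],
        "gaussian+gamma": [gaussian_feature_model, gamma_feature_model],
    }
    scores = {name: bic_score([pitch_feat, energy_feat], models)
              for name, models in candidates.items()}
    print(scores, "-> selected:", max(scores, key=scores.get))

In this toy setting each feature model can be swapped independently of the others, which is the kind of modularity that makes it practical to build a library of feature models and to select among alternative combined model structures.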